
Duplicate Content Filter: What it is and how it works - Tuesday, September 06, 2011

Duplicate Content has become a huge topic of discussion lately, thanks to the new filters that search engines have implemented. This article will help you understand why you might be caught in the filter, and ways to avoid it. We'll also show you how you can determine if your pages have duplicate content, and what to do to fix it.

Search engine spam is any deceitful attempt to deliberately trick the search engine into returning inappropriate, redundant, or poor-quality search results. This behavior is often seen in pages that are exact replicas of other pages, created to receive better results in the search engine. Many people assume that creating multiple or similar copies of the same page will either increase their chances of getting listed in search engines or help them get multiple listings, due to the presence of more keywords.

In order to make a search more relevant to a user, search engines use a filter that removes duplicate content pages from the search results, and the spam along with it. Unfortunately, good, hardworking webmasters have fallen prey to the filters imposed by the search engines that remove duplicate content. These webmasters often spam the search engines unknowingly, even though there are things they can do to avoid being filtered out. In order to truly understand the concepts you can implement to avoid the duplicate content filter, you need to know how this filter works.

First, we must understand that the term "duplicate content penalty" is actually a misnomer. When we refer to penalties in search engine rankings, we are actually talking about points that are deducted from a page in order to come to an overall relevancy score. But in reality, duplicate content pages are not penalized. Rather they are simply filtered, the way you would use a sieve to remove unwanted particles. Sometimes, "good particles" are accidentally filtered out.

Knowing the difference between the filter and the penalty, you can now understand how a search engine determines what duplicate content is. There are basically four types of duplicate content that are filtered out:

  1. Websites with Identical Pages - Pages that are exact replicas of one another are considered duplicate content, and websites that are identical to another website on the Internet are considered spam. Affiliate sites with the same look and feel which contain identical content, for example, are especially vulnerable to a duplicate content filter. Another example would be a website with doorway pages. Many times, these doorways are skewed versions of landing pages, which in turn are identical to other landing pages. Generally, doorway pages are intended to spam the search engines in order to manipulate search engine results.
  2. Scraped Content - Scraped content is taking content from a web site and repackaging it to make it look different, but in essence it is nothing more than a duplicate page. With the popularity of blogs on the internet and the syndication of those blogs, scraping is becoming more of a problem for search engines.
  3. E-Commerce Product Descriptions - Many eCommerce sites out there use the manufacturer's descriptions for the products, which hundreds or thousands of other eCommerce stores in the same competitive markets are using too. This duplicate content, while harder to spot, is still considered spam.
  4. Distribution of Articles - If you publish an article and it gets copied and put all over the Internet, this is good, right? Not necessarily for all the sites that feature the same article. This type of duplicate content can be tricky, because even though Yahoo and MSN determine the source of the original article and deem it most relevant in search results, other search engines like Google may not, according to some experts.

So, how does a search engine's duplicate content filter work? Essentially, when a search engine robot crawls a website, it reads the pages and stores the information in its database. Then, it compares its findings to other information it has in its database. Depending upon a few factors, such as the overall relevancy score of a website, it then determines which pages are duplicate content, and then filters out the pages or websites that qualify as spam. Unfortunately, if your pages are not spam but have enough similar content, they may still be regarded as spam.
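The exact comparison any search engine performs is proprietary, but the general idea can be sketched in a few lines: break each page's text into overlapping word sequences ("shingles") and measure how much the two sets overlap. This is only an illustration of the concept, not any engine's actual algorithm:

```python
def shingles(text, size=3):
    """Break text into overlapping word sequences ('shingles')."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(text_a, text_b, size=3):
    """Jaccard similarity of the two shingle sets: 1.0 means identical content."""
    a, b = shingles(text_a, size), shingles(text_b, size)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Pages with largely identical wording score far above unrelated pages.
page1 = "Our widgets are the best widgets on the market today"
page2 = "Our widgets are the best widgets available on the market today"
print(similarity(page1, page2))
```

A filter of this kind would flag pairs of pages whose score crosses some threshold, which is why even honest pages with "enough similar content" can be caught.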

There are several things you can do to avoid the duplicate content filter. First, you must be able to check your pages for duplicate content. Using our Similar Page Checker, you will be able to determine the similarity between two pages and make them as unique as possible. By entering the URLs of two pages, this tool will compare them and point out where they are similar, so that you can make them unique.
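The Similar Page Checker itself is a hosted tool, but the underlying idea, measuring how much of two pages' text overlaps, can be sketched with Python's standard difflib module (a rough approximation, not the tool's actual method):

```python
from difflib import SequenceMatcher

def page_similarity(text_a, text_b):
    """Return a 0.0-1.0 ratio of how similar two blocks of text are."""
    return SequenceMatcher(None, text_a, text_b).ratio()

original = "Welcome to our store, home of the finest hand-made widgets."
copied   = "Welcome to our shop, home of the finest hand-made widgets."
print(page_similarity(original, copied))  # close to 1.0 for near-duplicates
```

Running a check like this on your own pages against each other gives a quick sense of which ones a duplicate content filter might lump together.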

Since you need to know which sites might have copied your site or pages, you will need some help. We recommend using a tool that searches for copies of your page on the Internet: www.copyscape.com. Here, you can put in your web page URL to find replicas of your page on the Internet. This can help you create unique content, or even address the issue of someone "borrowing" your content without your permission.

Let's look at the issue of some search engines possibly not considering the source of the original content from distributed articles. Remember, some search engines, like Google, use link popularity to determine the most relevant results. Continue to build your link popularity, while using tools like www.copyscape.com to find out how many other sites have the same article; if allowed by the author, you may be able to alter the article so as to make the content unique.

If you use distributed articles for your content, consider how relevant the article is to your overall web page and then to the site as a whole. Sometimes, simply adding your own commentary to the articles can be enough to avoid the duplicate content filter; the Similar Page Checker could help you make your content unique. Further, the more relevant articles you can add to complement the first article, the better. Search engines look at the entire web page and its relationship to the whole site, so as long as you aren't exactly copying someone's pages, you should be fine.

If you have an eCommerce site, you should write original descriptions for your products. This can be hard to do if you have many products, but it really is necessary if you wish to avoid the duplicate content filter. Here's another example of why using the Similar Page Checker is a great idea. It can tell you how to change your descriptions so as to have unique and original content for your site. This works well for scraped content, too. Many scraped content sites offer news. With the Similar Page Checker, you can easily determine where the news content is similar, and then change it to make it unique.

Do not rely on an affiliate site which is identical to other sites, and do not create identical doorway pages. These types of behavior are not only filtered out immediately as spam; if another site or page is found to be a duplicate, there is generally no comparison of the page to the site as a whole, which can get your entire site in trouble.

The duplicate content filter is sometimes hard on sites that don't intend to spam the search engines. But it is ultimately up to you to help the search engines determine that your site is as unique as possible. By using the tools in this article to eliminate as much duplicate content as you can, you'll help keep your site original and fresh.

What is Robots.txt - Tuesday, September 06, 2011

Robots.txt

It is great when search engines frequently visit your site and index your content, but there are often cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling; otherwise you risk having a duplicate content penalty imposed. Also, if you have sensitive data on your site that you do not want the world to see, you will prefer that search engines do not index these pages (although in this case the only sure way to keep sensitive data out of the index is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets and javascript from indexing, you need a way to tell spiders to keep away from these items.

One way to tell search engines which files and folders on your Web site to avoid is with the use of the Robots metatag. But since not all search engines read metatags, the Robots metatag can simply go unnoticed. A better way to inform search engines of your wishes is to use a robots.txt file.
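For reference, the Robots metatag mentioned above goes in a page's head section and looks like this:

```html
<!-- Ask robots not to index this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Unlike robots.txt, it has to be placed on every individual page you want excluded, which is one more reason a single robots.txt file is usually the more practical choice.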

What Is Robots.txt?

Robots.txt is a text (not html) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall or a kind of password protection); putting up a robots.txt file is something like putting a note “Please, do not enter” on an unlocked door – you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is too naïve to rely on robots.txt to protect it from being indexed and displayed in search results.

The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.

The concept and structure of robots.txt was developed more than a decade ago, and if you are interested in learning more about it, visit http://www.robotstxt.org/ or go straight to the Standard for Robot Exclusion, because in this article we will deal only with the most important aspects of a robots.txt file. Next we will continue with the structure of a robots.txt file.

Structure of a Robots.txt File

The structure of a robots.txt file is pretty simple (and not very flexible) – it is simply a list of user agents and disallowed files and directories. Basically, the syntax is as follows:

User-agent:

Disallow:

“User-agent:” specifies which search engine crawlers the record applies to, and “Disallow:” lists the files and directories to be excluded from indexing. In addition to “User-agent:” and “Disallow:” entries, you can include comment lines – just put the # sign at the beginning of the line:

# All user agents are disallowed to see the /temp directory.

User-agent: *

Disallow: /temp/
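You can check how a well-behaved crawler would interpret such a file with Python's standard urllib.robotparser module (real search engine crawlers have their own parsers, but the behavior for a simple file like this is the same):

```python
from urllib import robotparser

rules = [
    "# All user agents are disallowed to see the /temp directory.",
    "User-agent: *",
    "Disallow: /temp/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)  # normally you would call rp.set_url(...) and rp.read()

print(rp.can_fetch("AnyBot", "http://mydomain.com/temp/file.html"))  # blocked
print(rp.can_fetch("AnyBot", "http://mydomain.com/index.html"))      # allowed
```

Feeding your own robots.txt through a check like this is a quick way to confirm the rules do what you intended before a real spider reads them.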

The Traps of a Robots.txt File

When you start making complicated files – i.e. you decide to allow different user agents access to different directories – problems can start, if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradicting directives. Typos are misspelled user-agents, directories, missing colons after User-agent and Disallow, etc. Typos can be tricky to find but in some cases validation tools help.

The more serious problem is with logical errors. For instance:

User-agent: *

Disallow: /temp/

User-agent: Googlebot

Disallow: /images/

Disallow: /temp/

Disallow: /cgi-bin/

The above example is from a robots.txt that disallows the /temp/ directory for all agents and then gives Googlebot a more restrictive record of its own. The logical trap here is that records are not cumulative: a crawler obeys only the record that matches its user agent most specifically, and the “*” record applies only to crawlers that have no record of their own. So when Googlebot reads this file, it skips the “*” record entirely and follows only the Googlebot record. In the example above this happens to work, because /temp/ is repeated in Googlebot's record; but had you listed only /images/ and /cgi-bin/ there, assuming the general /temp/ rule would still apply, Googlebot would happily crawl /temp/ – a directory you thought you had told it not to touch. You see, the structure of a robots.txt file is simple, but serious mistakes can still be made easily.
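A sketch of this resolution using Python's standard urllib.robotparser, which, like well-behaved crawlers, applies only the most specific matching record (here /temp/ is deliberately left out of the Googlebot record to show the trap):

```python
from urllib import robotparser

rules = [
    "User-agent: *",
    "Disallow: /temp/",
    "",
    "User-agent: Googlebot",
    "Disallow: /images/",
    "Disallow: /cgi-bin/",
    # note: /temp/ is NOT repeated in the Googlebot record
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Googlebot follows only its own record, so /temp/ is open to it...
print(rp.can_fetch("Googlebot", "http://mydomain.com/temp/x.html"))
# ...while every other crawler falls back to the "*" record.
print(rp.can_fetch("OtherBot", "http://mydomain.com/temp/x.html"))
```

The fix is simply to repeat every general rule inside each crawler-specific record, as the example file in the text does.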

Tools to Generate and Validate a Robots.txt File

Having in mind the simple syntax of a robots.txt file, you can always read through it to see if everything is OK, but it is much easier to use a validator, like this one: http://tool.motoricerca.info/robots-checker.phtml. These tools report common mistakes like missing slashes or colons, which, if not detected, compromise your efforts. For instance, if you have typed:

User agent: *

Disallow: /temp/

this is wrong because there is no hyphen between “user” and “agent” and the syntax is incorrect.
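At its core, a validator needs little more than a line-by-line syntax check. Here is a minimal sketch (far less thorough than the real tools) that catches exactly the kind of mistake shown above:

```python
import re

# A valid non-blank line is a comment or "<known-directive>: value"
VALID_LINE = re.compile(r"^(user-agent|disallow|allow|sitemap)\s*:", re.IGNORECASE)

def check_robots_txt(text):
    """Return a list of (line_number, line) pairs that look malformed."""
    errors = []
    for number, line in enumerate(text.splitlines(), start=1):
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # blank lines and comments are fine
        if not VALID_LINE.match(stripped):
            errors.append((number, stripped))
    return errors

good = "User-agent: *\nDisallow: /temp/"
bad = "User agent: *\nDisallow: /temp/"   # missing hyphen in "User-agent"
print(check_robots_txt(good))  # no errors
print(check_robots_txt(bad))   # flags line 1
```

A real validator also checks logical problems like the record-precedence trap described earlier, but even a syntax pass like this catches the most common typos.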

In those cases when you have a complex robots.txt file – i.e. you give different instructions to different user agents, or you have a long list of directories and subdirectories to exclude – writing the file manually can be a real pain. But do not worry – there are tools that will generate the file for you. What is more, there are visual tools that allow you to point and select which files and folders are to be excluded. But even if you do not feel like buying a graphical tool for robots.txt generation, there are online tools to assist you. For instance, the Server-Side Robots Generator offers a dropdown list of user agents and a text box for you to list the files you don't want indexed. Honestly, it is not much help unless you want to set specific rules for different search engines, because in any case it is up to you to type the list of directories, but it is better than nothing.

Your Website from Google Banned to Google Unbanned - Tuesday, September 06, 2011

Even if you are not looking for trouble and do not violate any known Google SEO rule, you still might have to experience the ultimate SEO nightmare - being excluded from Google’s index. Although Google is a kind of a monopolist among search engines, it is not a bully company that excludes innocent victims for pure pleasure. Google keeps rigorously to SEO best practices and excludes sites that misbehave.

If you own and run a blog or website then being listed by Google is a very important step so it is read by as many people as possible; but what if your website gets Google banned? If this has happened to you, then you know that it hurts your site because you won’t show up in the Google search engine and that means less traffic to your site. Getting unbanned from Google is a long and drawn out process. And sometimes Google won’t even tell you the reason they banned your website in the first place, which doesn’t make things any easier.

Some of the ways a site can be Google banned include having spam on it, stuffing in too many keywords that clog up your site, making your own URLs redirect to each other, improperly inserting a robots.txt file, duplicating your own pages and sending people to them over and over, and linking to bad sites like those with adult content, gambling or other unauthorized areas. There are multiple other reasons, so it’s a good idea to try to get Google to let you know the reason for the ban. That will make it much simpler to fix the problem. Over-optimization has many faces, and you can have a look at the Optimization, Over-Optimization or SEO Overkill? article to get some ideas of practices that you should avoid.

Here are the necessary steps that you need to follow in order to get Google reconsideration for getting unbanned. Be sure to follow Google reconsideration request process precisely and correctly if you want to get your website unbanned and get your site back in business providing whatever products or services that it has:

1 Send a Google Reconsideration Request for Getting Unbanned

Getting Google reinclusion of your website requires putting in a Google reconsideration request. First, the way you know your site is Google banned is that suddenly it doesn’t have a page rank on it. Then, in order to determine for sure that this is the case, enter your site at www.yoursite.com into Google, using whatever the name of your site is instead of the words yoursite. If you don’t see any of your pages there, then it’s likely you were Google banned.

Another way to tell if you are truly Google banned is to see if your pages show up in page indexing on Google. Or, if it is a news blog then you can go to www.googlenews.com and if you don’t see your articles there, you will also know you were probably banned from Google and now need to send a Google reconsideration request.

2 Be Polite to Google

Next, remember that you are sending your Google reconsideration request to a real person who works for Google – someone at Google will actually read your request. Therefore you want to be polite and go into as much detail as possible, as it is better to give too much information than not enough in this situation. Being nice counts here, and if you act like a jerk, it’s likely no one will want to help you.

3 Provide Information about the Domain

List things such as whether it was a brand new domain name, tell them some background about your website, and also tell them the rules you think you may have broken. If there have been spam clicks on your account, gather proof of them and write to them about it. This shows them you are serious about resolving the problem when sending the Google reconsideration request. Put down everything that you think someone would need to know in order to understand who you are and to jog their memory on why you were banned in the first place. Be sure to do your research so you will understand what is going on and can fully explain it to the Google representatives when sending the reconsideration request to Google.

4 Explain the Solution to the Past Problem

While sending the reconsideration request to Google, tell the representative what you have already done to fix the problem that caused you to be banned. Spell it out in detail and give them your actual page URLs to prove it. It’s best to give as much information and data as you can so they will understand what you did to solve the issue. For example, if your site linked to bad sites, you must make sure that you remove every one of those links. Be sure to have removed all spam, or anything else that Google doesn’t approve of or like. Then, prove to Google that you did this by showing them the evidence. Or, if you had invalid clicks, which is one of the common reasons to get Google banned, show why the clicks were valid. It takes all this sort of information, in detail, to make them understand the situation and help you to resolve it. Also, ensure that the changes now made to your website meet the requirements for Google reinclusion. Don’t do anything on your website that may annoy them.

5 Verify the Website

Next, log in to your Google webmaster account and add and verify your site. Then go to http://www.google.com/webmasters/tools/reconsideration. This is the area that you use to put in your reconsideration request to Google to be unbanned. You can also send the information in an email to help@google.com. This is where Google representatives give support to customers. You may also have to sign up for Google webmaster tools once you are logged into your account, if you don’t already have it.

6 Provide Proof

It’s never a good idea to try to blame Google, or to claim you have no idea what you did wrong. You need real proof for Google reinclusion, not just blame or playing dumb. Show them the proof of the changes you made for Google reconsideration. Finally, always be considerate and thank them for the time and effort they are taking to look into your reconsideration request, to help you solve the problems, and to get you unbanned from Google so your site can be relisted and you can keep getting the traffic you need to run your business, blog or news site.

7 Be Patient

It can take several weeks for a Google representative to get back to you and answer your Google reinclusion request. They have a lot of other things to handle, and you need to understand that you aren’t the only one who may be having issues. While you are waiting, continue to look over your site and try to make sure all the alleged violations are fixed and good to go.

8 Send Follow Up Email for Google reconsideration

Be sure to send follow-up emails to Google to ask how the request is going and if they know when the situation will be resolved. You probably shouldn’t send one every day, because this could be regarded as you being a pest, but be sure to send one in periodically until you get an answer that you understand and can deal with to solve the Google banned problem.

All in all, switching your site from Google banned to Google unbanned can be a time-consuming and complicated process, but with the proper preparation and information, you should be well on your way to being in their good graces again. It’s well worth your effort, so just follow these steps and Google should get back to you and fix your situation and your site.

Web Directories and Specialized Search Engines - Tuesday, September 06, 2011

SEO experts spend most of their time optimizing for Google and occasionally one or two other search engines. There is nothing wrong with this, and it is most logical, having in mind that topping Google is the lion's share of Web popularity. But very often, no matter what you do, topping Google does not happen. Or sometimes the price you need to pay (not literally, but in terms of effort and time) to top Google and stay there is too high. We should also mention the ultimate SEO nightmare – being banned from Google, when you simply can't use Google (at least not until you are readmitted to the club) and, whether you like it or not, you need to have a look at possible alternatives.

What Are the Google Alternatives?

The first alternative to Google is obvious – optimize for the other major search engines, if you have not done it already. Yahoo! and MSN (to a lesser degree) can bring you enough visitors, though sometimes it is virtually impossible to optimize for the three of them at the same time because of the differences in their algorithms. You could also optimize your site for (or at least submit to) some of the other search engines (Lycos, Excite, Netscape, etc.) but having in mind that they altogether hardly have over 3-5% of the Web search traffic, do not expect much.

Another alternative is to submit to search directories (also known as Web directories) and specialized search engines. Search directories might sound so pre-Google but submitting to the right directories might prove better than optimizing for MSN, for example. Specialized search engines and portals have the advantage that the audience they attract consists of people who are interested in a particular topic and if this is your topic, you can get to your target audience directly. It is true that specialized search engines will not bring you as many visitors, as if you were topping Google but the quality of these visitors is extremely high.

Naming all Google alternatives would be a long list and it is outside the scope of this article but just to be a little more precise about what alternatives exist, we cannot skip SEO instruments like posting to blogs and forums or paid advertisements.

Web Directories

What is a Web Directory?

Web directories (or as they are better known – search directories) existed before the search engines, especially Google, became popular. As the name implies, web directories are directories where different resources are gathered. Similarly to desktop directories, where you gather files in a directory based on some criterion, Web directories are just enormous collections of links to sites, arranged in different categories. The sites in a Web directory are listed in some order (most often alphabetic but it is not necessarily so) and users browse through them.

Although many Web directories offer search functionality of some kind (otherwise it would be impossible to browse thousands of pages for, let's say, Computers), search directories are fundamentally different from search engines in two ways – most directories are edited by humans, and URLs are not gathered automatically by spiders but submitted by site owners. The main advantage of Web directories is that no matter how clever spiders become, when there is a human to view and check the pages, there is less chance that pages will be classified in the wrong categories. The disadvantages of the first difference are that the lists in Web directories are sometimes outdated, if no human was available to do the editing and checking for some time (but this is not that bad, because search engines also deliver pages that do not exist anymore), and that sometimes you might have to wait half a year before being included in a search directory.

The second difference – no spiders – means that you must go and submit your URL to the search directory, rather than sit and wait for the spider to come to your site. Fortunately, this is done only once for each directory, so it is not that bad.

Once you are included in a particular directory, in most cases you can stay there as long as you wish to and wait for people (and search engines) to find you. The fact that a link to your site appears in a respectable Web directory is good because first, it is a backlink and second, you increase your visibility for spiders, which in turn raises your chance to be indexed by them.

Examples of Web Directories

There are hundreds and thousands of search directories but undoubtedly the most popular one is DMOZ. It is a general purpose search directory and it accepts links to all kinds of sites. Other popular general-purpose search directories are Google Directory and Yahoo! Directory. The Best of the Web is one of the oldest Web directories and it still keeps to high standards in selecting sites.

Besides general-purpose Web directories, there are incredibly many topical ones. For instance, The Environment Directory lists links to environmental sites only, while The Radio Directory lists thousands of radio stations worldwide, arranged by country, format, etc. There are also many local and national Web directories, which accept links only to sites about a particular region or country and which can be great if your site is targeted at a local or national audience only. It is not possible to list even the topics of the specialized search directories, because the list would get incredibly long. Using Google and specialized search resources like The Search Engines Directory, you can find on your own many directories that are related to your area of interest.

Specialized Search Engines

What is a Specialized Search Engine?

Specialized search engines are one more tool to include in your SEO arsenal. Unlike general-purpose search engines, specialized search engines index pages for particular topics only and very often there are many pages that cannot be found in general-purpose search engines but only in specialized ones. Some of the specialized search engines are huge sites that actually host the resources they link to, or used to be search directories but have evolved to include links not only to sites that were submitted to them. There are many specialized search engines for every imaginable topic and it is always wise to be aware of the specialized search engines for your niche. The examples in the next section are by no means a full list of specialized search engines but are aimed to give you the idea of what is available. If you search harder on the Web, you will find many more resources.

Examples of Specialized Search Engines

Specialized search engines are probably not as numerous as Web directories, but there is certainly no shortage of them either, especially if one counts password-protected sites with databases accessible only from within the site as specialized search engines. As with Web directories, a list of specialized search engines would be really, really long (and constantly changing), so instead here are some links to lists of search engines: Pandia Powersearch, Webquest, Virtual Search Engines, the already mentioned The Search Engines Directory, etc. What these lists have in common is that they offer a selection of specialized search engines, arranged by topic, so they are a good starting point for the hunt for specialized search engines.
Optimizing for MSN - Tuesday, September 06, 2011

SEO experts often forget that there are three major search engines. While there is no doubt that Google is number one with the most searches, and Yahoo! manages to get about a quarter of the market, MSN has not retired yet. It holds about 10-15 percent of searches (according to some sources even less – about 5%), but it has a loyal audience that can't be reached through the other two major search engines, so if you plan a professional SEO campaign, you can't afford to skip MSN. In a sense, getting high rankings in MSN is similar to getting high rankings for less popular keywords – because the competition is not as tough, you may be able to get enough visitors from MSN alone, compared to what you would get after optimizing for a more popular search engine.

Although optimizing for MSN is different from optimizing for Google and Yahoo!, there are still common rules that will help you to rank high in any search engine. As a rule, if you rank well in Google, chances are that you will rank well in Yahoo! (if you are interested in the tips and tricks for optimizing for Yahoo!, you may want to have a look at the Optimizing for Yahoo! article) and MSN as well. The opposite is not true, however. If you rank well in MSN, there is no guarantee that you'll do the same in Google. So, when you optimize for MSN, keep an eye on your Google ranking as well. It's no good to top MSN and be nowhere in Google (the opposite is more acceptable, if you need to make the choice).

But why is this so? The answer is simple - the MSN algorithm is different and that is why, even if the same pages were indexed, the search results will vary.

The MSN Algorithm

As already mentioned, it is the different MSN algorithm that leads to such drastic results in ranking. Otherwise, MSN, like all search engines, first spiders the pages on the Web, then indexes them in its database and after that applies the algorithm to generate the pages with the search results. So, the first step in optimizing for MSN is the same as for the other search engines – to have a spiderable site. (Have a look at Search Engine Spider Simulator to see how spiders see your site). If your site is not spiderable, then you don't have even a hypothetical chance to top the search results.

There is quite a lot of speculation about the MSN algorithm. Looking at the search results MSN delivers, it is obvious that its search algorithm is not as sophisticated as Google's, or even Yahoo!'s, and many SEO experts agree that the MSN search algorithm is years behind its competitors. So, what can you do in this case? Optimize as you did for Google a couple of years ago? You are not far from the truth, though actually it is not that simple.

One of the most important differences is that MSN still relies heavily on metatags, as explained below. None of the other major search engines uses metatags that heavily anymore. It is obvious that metatags give SEO experts a great opportunity for manipulating search results. Maybe metatags are the main reason for the inaccurate search results that MSN often produces.

The second most important difference between MSN and the other major search engines is their approach to keywords. For MSN, too, keywords are very important, but unlike Google, MSN is dominated by onpage factors, while offpage factors (backlinks, for example) are still of minor importance. It is a safe bet that the importance of backlinks will grow in the future, but for now they are not a primary factor for high rankings.

Keywords, Keywords, Keywords

It is hardly surprising that keywords are the most important item for MSN. What is surprising is how much MSN relies on them. It is very easy to fool MSN – just artificially inflate your keyword density, put a couple of keywords in file names (and even better, in domain names) and near the top of the page, and you are almost done for MSN. But if you use the above-mentioned black hat practices, your joy of topping MSN will not last long because, unless you provide separate pages that are optimized for Google, your stuffed pages might pretty well get you banned from Google. And if you decide to have separate pages for Google and MSN, first, it is hardly worth the trouble, and second, the risk of a duplicate content penalty can't be ignored.

So, what is the catch? The catch is that if you try to polish your site for MSN and stuff it with keywords, this might get you into trouble with Google, which certainly is worse than not ranking well in MSN. But if you optimize wisely, it is more likely than not that you will rank decently in Google and perform well in Yahoo! and MSN as well.

Metatags

Having meaningful metatags never hurts but with MSN this is even more important because its algorithm still uses them as a primary factor in calculating search results. Having well-written (not stuffed) metatags will help you with MSN and some other minor search engines, while at the same time well-written metatags will not get you banned from Google.

The Description metatag is very important:

<META NAME="Description" CONTENT="Place your description here" />

MSNBot reads its content and, based on that (in addition to the keywords found on the page), judges how to classify your site. So if you leave this tag empty (i.e. CONTENT=""), you miss a vital chance to be noticed by MSN. There is no evidence that MSN uses the other metatags in its algorithm, which is why leaving the Description metatag empty is even more unforgivable.
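To make this concrete, here is a minimal sketch of a head section with a filled-in Description metatag. The site name and wording are invented for the example:

```html
<head>
  <title>Acme Leather Wallets - Handmade in Texas</title>
  <!-- MSNBot reads this text and, together with the keywords found on
       the page, uses it to classify the site - never leave it empty -->
  <META NAME="Description" CONTENT="Handmade leather wallets and belts, crafted in Texas. Free shipping on orders over $50." />
</head>
```

Write it as a short, readable sentence or two about the page, not as a keyword list – a stuffed description helps neither MSN nor your Google standing.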

Comments (0)
How to get Traffic from Social Bookmarking sites - Tuesday, September 06, 2011

Sites like digg.com, reddit.com, stumbleupon.com, etc. can bring you a LOT of traffic. How about getting 20,000 or more visitors a day when your listing hits the front page?
Getting to the front page of these sites is not as difficult as it seems. I have been successful with Digg and del.icio.us (and not so much with Reddit, though the same steps should apply to it as well) multiple times and have compiled a list of steps that have helped me succeed:

1. Pay attention to your Headlines

Many great articles go unnoticed on social bookmarking sites because their headline is not catchy enough. Your headline is the first (and very often the only) thing users will see from your article, so if you don't make the effort to provide a catchy headline, your chances of getting to the front page are small.
Here are some examples to start with:

Original headline: The Two Types of Cognition
Modified headline: Learn to Understand Your Own Intelligence

Original headline: Neat way to organize and find anything in your purse instantly!
Modified headline: How to Instantly Find Anything in Your Purse

Here is a good blog post that should help you with your headlines.

2. Write a meaningful & short description

The headline is very important to draw attention, but if you want to keep that attention, a meaningful description is vital. The description must be slightly provocative, because this draws more attention, but never use lies and false facts to provoke interest. For instance, if you write “This article will reveal to you the 10 sure ways to deal with stress once and forever and live like a king from now on.”, visitors will hardly think that your story is true and fact-based.

You also might be tempted to use a long tell-it-all paragraph to describe your great masterpiece but have in mind that many users will not bother to read anything over 100-150 characters. Additionally, some of the social bookmarking sites limit descriptions, so you'd better think in advance how to describe your article as briefly as possible.

3. Have a great first paragraph

This is a rule that is always true, but for successful social bookmarking it is even more important. If you have successfully passed Level 1 (headlines) and Level 2 (description) in the Catch the User's Attention game, don't let a bad first paragraph make them leave your site.

4. Content is king

However, the first paragraph is not everything. Going further along the chain of drawing (and retaining) users' attention, we reach the Content is King Level. If your articles are just trash, bookmarking them is useless. You might cheat users once but don't count on repetitive visits. What is more, you can get your site banned from social bookmarking sites, when you persistently post junk.

5. Make it easy for others to vote / bookmark your site

It is best when other people, not you, bookmark your site. Therefore, you must do your best to make it easy for them. You can put a bookmarking button at the end of the article, so that if users like your content, they can easily post it. If you are using a CMS, check if there is an extension that allows you to add Digg, del.icio.us and other buttons, but if you are using static HTML, you can always go to the social bookmarking site and copy the code that will add its button to your pages.
Here is a link that should help you add Links for Del.icio.us, Digg, and More to your pages.
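For a static HTML page, the buttons boil down to plain links that pass your page's URL and title as query parameters. The submit URL formats below follow what Digg and del.icio.us published around this time, but double-check them on the sites themselves; the page address is a made-up example:

```html
<!-- Hand-coded bookmarking links for a static page.
     Replace the url and title parameters with your own values,
     URL-encoded (spaces become %20, "/" becomes %2F, etc.) -->
<p>
  <a href="http://digg.com/submit?url=http%3A%2F%2Fwww.example.com%2Fmy-article.html&amp;title=My%20Article">
    Digg this
  </a>
  |
  <a href="http://del.icio.us/post?url=http%3A%2F%2Fwww.example.com%2Fmy-article.html&amp;title=My%20Article">
    Bookmark on del.icio.us
  </a>
</p>
```

Note that the ampersand separating the query parameters is written as &amp;amp; so the HTML stays valid.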

6. Know when to submit

The time when you submit can be crucial for your attempts to get to the front page. On most social bookmarking sites you have only 24 hours to get to the front page and stay there. So, if you post when most users (and especially your supporters) are still sleeping, you are wasting valuable time. By the time they get up, you might have gone to the tenth page. You'd better try it for yourself and see if it works for you but generally posting earlier than 10 a.m. US Central Time is not good. Many people say that they get more traffic around 3 p.m. US Central Time. Also, workdays are generally better in terms of traffic but the downside is that you have more competitors for the front page than on weekends.

7. Submit to the right category

Sometimes a site might not work for you because there is no right category for you. Or because you don't submit to the right category – technology, health, whatever – but to categories like General, Miscellaneous, etc. where all unclassified stuff goes. And since these categories fill very fast, your chance to get noticed decreases.

8. Build a top profile

Not all users are equal on social bookmarking sites. If you are an old and respected user who has posted tons of interesting stuff, this increases the probability that what you submit will get noticed. Posting links to interesting articles on other sites is vital for building a top profile. Additionally, it looks suspicious when your profile has links to only one site. Many social bookmarking sites frown on users submitting their own content, because it feels like self-promotion.

9. Cooperate with other social bookmarkers

The lone wolf is a suicidal strategy on sites like StumbleUpon, Digg and Netscape. Many stories make it to the front page not only because they are great but because they are backed up by your network of friends. If in the first hours after your submission you get at least 15 votes from your friends and supporters, it is more likely that other users will vote for you too. 50 votes can get you to the front page of Digg.

10. Submit in English

Linguistic diversity is great, but the majority of users are from English-speaking countries and they don't understand exotic languages. So, on most social bookmarking sites, submitting anything in a language other than English is not recommended. The languages at a particular disadvantage are Chinese, Arabic, Slavic languages and all the others that use a non-Latin alphabet. German, Spanish and French are more understandable, but they are still not English. If you really must submit your story (e.g. because you need the backlink), include at least an English translation of the title. But the best way to proceed with non-English stories is to post them where they belong. Check this link for a list of non-English sites.

11. Never submit old news

Submitting old news will not help you become a respected user. Yesterday's news is history. But if you still need to submit old stuff, consider feature articles, how-tos and similar pieces that stay up-to-date for a long time.

12. Check your facts

You must be flattered that users read your postings, but you will hardly be flattered when users prove that you haven't got the facts right. In addition to sarcastic comments, you might also receive negative votes for your story, so if you want to avoid this, check your facts – or your readers will do it for you.

13. Check your spelling

Some sites do not allow you to edit your posts later, so if you misspell the title, the URL, or a keyword, it will stay that way forever.

14. Not all topics do well

But sometimes even great content and submitting to the right category do not push you to the top. One possible reason could be that your stories are about unpopular topics. Many sites have topics that their users love and topics that don't sell that well. For instance, Apple sells well on Digg and The War in Iraq on Netscape. Negative stories - about George Bush, Microsoft, evil multinational companies, corruption and crime also have a chance to make it to the front page. You can't know these things in advance but some research on how many stories tagged with keywords like yours have made the front page in the last year or so can give you a clue.

15. Have Related Articles / Popular Articles

Traffic gurus joke that traffic from social bookmarking sites is like an invasion – the crowds pour in, and in a day or two they are gone. Unfortunately this is true – after your listing rolls off the front page (provided that you reached the front page), the drop in traffic is considerable. Besides, many users just follow the link to your article, have a look at it, and are gone. One of the ways to keep them longer on your site is to have links to Related Articles / Popular Articles or something similar that can draw their attention to other stuff on the site and make them read more than one article.
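A Related Articles box does not need anything fancy – a plain list of links under the article will do. The headlines and URLs below are invented for the example:

```html
<!-- Shown under each article to keep front-page visitors reading
     more than one page before they leave -->
<div class="related-articles">
  <h3>Related Articles</h3>
  <ul>
    <li><a href="/how-to-write-catchy-headlines.html">How to Write Catchy Headlines</a></li>
    <li><a href="/best-time-to-submit-to-digg.html">The Best Time to Submit to Digg</a></li>
    <li><a href="/building-a-top-profile.html">Building a Top Profile on Social Sites</a></li>
  </ul>
</div>
```

If your CMS can pick the links automatically by tag or category, even better – hand-picked lists go stale quickly.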

16. RSS feeds, newsletter subscriptions, affiliate marketing

RSS feeds, newsletter subscriptions and affiliate marketing are all areas in which the traffic from social bookmarking sites can help you a lot. Many people who come to your site and like it will subscribe to your RSS feed and/or newsletter. So, put these in visible places and you will be astonished at the number of new subscriptions you get on the day you are on the front page of a major social bookmarking site.

17. Do not use automated submitters

After some time of active social bookmarking, you will discover that you are spending hours on end posting links. Yes, this is a lot of time, and using automated submitters might look like the solution, but it isn't. Automated submitters often carry malware or are used to steal passwords, so unless you don't care about the fate of your profile and don't mind being banned, automated submitters are not the way to go.

18. Respond to comments on your stories

Social bookmarking sites are not newsgroups, but interesting articles can trigger a pretty heated discussion with hundreds of comments. If your article gets comments, you should be proud. Always respond to comments on your stories and, even better, post comments on other stories you find interesting. This is a way to make friends and build a top profile.

19. Prepare your server for the expected traffic

This is hardly a point of minor importance, though we take it for granted that you are hosting your site on a reliable server that does not crash twice a day. Have in mind that your presence on the front page of a major social bookmarking site can drive a lot of traffic to you, which can – literally – cause your server to crash!
I remember one of the times I was on the front page on Digg, I kept restarting Apache on my dedicated server because it was unable to cope with the massive traffic. I have many tools on my site and when the visitors tried them, this loaded the server additionally.
Well, for an articles site getting so much traffic is not so devastating, but if you are hosting on a so-so server, you'd better migrate your site to a machine that can handle a lot of simultaneous hits. Also, check whether your monthly traffic allowance is enough to handle 200,000-500,000 or even more visitors. It is very amateurish to attract a lot of visitors and not be able to serve them because your server crashed or you exceeded your bandwidth!

20. The snowball effect

But despite the differences in the tastes of the different social bookmarking communities, there are striking similarities. You will soon discover that if a post is popular on one of the major sites, this usually drives it up on the other big and smaller sites. Usually it is Digg posts that become popular on StumbleUpon and Reddit, but there are many other examples. To use this fact to your advantage, you may want to concentrate your efforts on getting to the front page of the major players only, and bet on the snowball effect to drive you to the top on the other sites.
An additional benefit of the snowball effect is that if your posting is interesting and people start blogging about it, you can get tons of backlinks from their blogs. This happened to me and the result was that my PR jumped to 6 on the next update.

Comments (0)
Choosing SEO as Your Career - Tuesday, September 06, 2011

It's always better to know in advance what you can expect from a career in SEO.

Some Good Reasons to Choose SEO as Your Career

1. High demand for SEO services

Once, SEO was not a separate profession – Web masters performed some basic SEO for the sites they managed, and that was all. But as sites began to grow and make money, it became more reasonable to hire a dedicated SEO specialist than to have the Web master do it. The demand for good SEO experts is high and constantly on the rise.

2. A LOT of people have made a successful SEO career

There is plenty of living proof that SEO is a viable business. The list is too long to quote here, but some of the names include Rob from Blackwood Productions, Jill Whalen from High Rankings, Rand Fishkin from SEOmoz and many others.

3. Search Engine Optimizers Make Good Money!

SEO is a profession that can be practiced while working for a company or as a solo practitioner. There are many job boards, like Dice and Craigslist, that publish SEO job advertisements. It is worth noting that the compensation for SEO employees is equal to or even higher than that of developers, designers and marketers. Salaries over $80K per annum are not an exception for SEO jobs.
As a solo SEO practitioner you can make even more money. Almost all freelance sites have sections for SEO services, and offers for $50 an hour or more are quite common. If you are still not confident that you can work on your own, you can take an SEO job, learn a bit and then start your own company.
If you already feel confident that you know a lot about SEO, you can take this quiz and see how you score. Well, don't get depressed if you didn't pass – here is a great checklist that will teach you a lot, even if you are already familiar with SEO.

4. Web designing alone MAY NOT be enough

Many companies offer turn-key solutions that include Web design, Web development AND SEO optimization. In fact, many clients expect that when they hire somebody to make their site, the site will be SEO friendly, so if you are good both as a designer and as an SEO expert, you will be a truly valuable professional.
On the other hand, many other companies deal with SEO only, because they feel that this way they can concentrate their efforts on their major strength – SEO – so you can consider this possibility as well.

5. Logical step ahead if you come from marketing or advertising

The Web has changed the way companies do business, so to some extent today's marketers and advertisers need to have at least some SEO knowledge if they want to be successful. SEO is also a great career for linguists.

6. Lots of Learning

For somebody who comes from design, development or web administration, SEO might not look technical enough, and you might feel that you would be downgrading if you moved to SEO. Don't worry so much – you can learn a LOT from SEO, so if you are a talented techie, you are not downgrading but actually upgrading your skill set.

7. SEO is already recognized as a career

Finally, if you need some more proof that SEO is a great career, have a look at the available SEO courses and exams for SEO practitioners. They might not be a Cisco certification, but they still help to institutionalize the SEO profession.

Some Ugly Aspects of SEO

1. Dependent on search engines

It is true that in any career there are many things that are outside of your control but for SEO this is a rule number one. Search engines frequently change their algorithms and what is worse – these changes are not made public, so even the greatest SEO gurus admit that they make a lot of educated guesses about how things work. It is very discouraging to make everything perfect and then to learn that due to a change in the algorithm, your sites dropped 100 positions down. But the worst part is that you need to communicate this to clients, who are not satisfied with their sinking ratings.

2. No fixed rules

Probably this will change over time, but for now the rule is that there are no rules – or at least no written ones. You can work very hard, follow everything that looks like a rule, and still success may not come. Currently you can't even take a search engine to court over the damage done to your business, because search engines are not obliged to rank highly sites that have made efforts to get optimized.

3. Rapid changes in rankings

But even if you somehow manage to get to the top for a particular keyword, keeping the position requires constant efforts. Well, many other businesses are like that, so this is hardly a reason to complain – except when an angry customer starts shouting at you that this week their ratings are sinking and of course this is all your fault.

4. SEO requires patience

The SEO professional and customers both need to understand that SEO takes constant effort and time. It could take months to move ahead in the ratings, or to build tens of links. Additionally, if you stop optimizing for some time, most likely you will experience a considerable drop in ratings. You need lots of motivation and patience not to give up when things are not going your way.

5. Black hat SEO

Black hat SEO is probably one of the biggest concerns for the would–be SEO practitioner. Fraud and unfair competition are present in any industry and those who are good and ethical suffer from this but black hat SEO is still pretty widespread. It is true that search engines penalize black hat practices but still black hat SEO is a major concern for the industry.

So, let's hope that by telling you about the pros and cons of choosing SEO as your career we have helped you make an informed decision about your future.

Comments (0)
SEO Careers during a Recession - Tuesday, September 06, 2011

I don't know whether most people became SEO experts because they planned ahead and figured that SEO careers are relatively stable in the long run compared to other business areas, or for completely different reasons, but my feeling is that SEO experts are lucky now. Why? Because while the recession makes many industries writhe in pain, many SEO professionals are in top financial shape and full of optimism for the future.

It would be an exaggeration to say that the SEO industry doesn't feel the recession. This is not exactly so, but compared to industries such as automobiles, newspapers, banking, real estate, etc., SEO looks like a coveted island of financial security. This doesn't mean that there is no drop in volumes or that everybody in SEO is working for top dollar, but as a whole the SEO industry, and the individuals who make their living in SEO, are doing much better than many other employees and entrepreneurs.

What Can You Expect from Your SEO Career During a Recession?

The question of what realistic expectations are is fundamental. I bet there are people in SEO who are not very happy with their current situation and blame the recession for it. Well, if most of your clients were from troubled industries (cars, real estate, financial services, etc.), then you do have a reason to complain. In such cases you should be happy if you can pay the bills. What you can do (if you haven't already done it) is look for new customers from other industries.

Another factor that influences your expectations about your SEO career during the recession is your position on the career ladder. It makes a big difference whether you work for a company or are your own boss. Being an employee has always been the more vulnerable position, so if you expect job security, it is easier to achieve as an independent SEO contractor. Mass layoffs might not be common for SEO companies, but hired workers are never immune to them.

Additionally, your skill level affects how your SEO career will be influenced by the recession. The recession is not the right time for novices to enter SEO. Many people from other industries rush to SEO as a life belt. When these people don't have the right skills and expertise but expect rivers of gold, this inevitably leads to disappointment.

What Makes SEO Careers Recession-Proof?

So, if you are a seasoned SEO practitioner and you don't dream of rivers of gold, you can feel safe with SEO because unlike careers in many other industries SEO careers are relatively recession-proof. Here are some of the reasons why SEO careers are recession-proof:

  • The SEO market is an established market. If you remember the previous recession at the beginning of the century, when the IT industry was among the most heavily stricken, you might be a bit skeptical and expect the same story now. But it is not the same now: SEO is not a new service anymore, and the SEO market itself is more established than it was a couple of years ago. This is what makes the present recession different from the previous one – the difference is fundamental and it can't be neglected.

  • SEO is one of the last expenses companies cut. SEO has already become a necessity for companies of any size. Unlike hardware, cars, not to mention entertainment and business trips, SEO expenses are usually not that big, but they help a company stay afloat. That is why, when a company decides to make cuts in the budget, SEO expenses are usually not among the things that get the largest cut (or any cut at all).

  • SEO has great ROI. The Return On Investment (ROI) for money spent on SEO is much higher than the ROI for other types of investments. SEO brings companies money and this is what makes it such a great investment. Stop SEO and the money stops coming as well.

  • Many clients start aggressive SEO campaigns in an attempt to get better results fast. During a recession SEO is even more important. That is why many clients decide that an aggressive SEO campaign will help them get more clients and as a result these clients double their pre-recession budgets.

  • SEO is cheaper than PPC. SEO is just one of the many ways for a site to get traffic. However, it is also one of the most effective ways to drive tons of traffic. For instance, if you consider PPC, the cost advantages of SEO are obvious. PPC is very expensive and as a rule, ranking high in organic search results even for competitive keywords is cheaper than PPC.

  • Cheaper than traditional promotion methods. Traditional promotion methods (i.e. offline marketing) are still an option, but their costs are even higher than PPC and the other forms of online promotion. Besides, many companies have given up offline marketing completely and turned to SEO as their major way to promote their business and attract new clients.

  • SEO is a recurring expense. Many businesses build their business model around memberships and other forms of recurring payments. For you, memberships and other types of recurring payments are presold campaigns – i.e. more or less you know that if a client is happy with a campaign you did for them, they will return. Acquiring recurring clients is very beneficial because you have fewer expenses in comparison to acquiring clients one by one.

The outlook for SEO careers during times of recession is pretty positive. As we already mentioned, it is possible to experience drops in volume, or for some of your clients to go down the bankruptcy road, but as a whole SEO offers more stability than many other careers. If you manage to take advantage of the above-mentioned recession-proof specifics of SEO and you are a real professional, you will hardly feel the recession in all its bitterness.

Comments (0)
HTML 5 and SEO - Tuesday, September 06, 2011

HTML 5 is still in the making, but for any SEO expert who tries to look ahead, some knowledge of HTML 5 and how it will impact SEO is useful. It is true that the changes and new concepts in HTML 5 will affect Web developers and designers much more than SEO experts, but it would still be far from the truth to say that HTML 5 will not mean changes in organic SEO policy.

What's New in HTML 5?

HTML 5 follows the way the Net has evolved in the last years and includes many useful tags and elements. At first glance, it might look as if HTML 5 is going in the direction of a programming language (e.g. PHP), but actually this is not so – it is still a markup language for presenting content. The new tags and elements might make HTML 5 look more complex, but this is only at first glance.

HTML 5 is not very different from HTML 4. One of the basic ideas in the development of HTML 5 was to ensure backward compatibility and because of that HTML 5 is not a complete revamp of the HTML specification. So, if you had worries that you will have to start learning it from scratch, these worries are groundless.

How Will the Changes in HTML 5 Affect SEO?

As an SEO expert, you are most likely interested mainly in those changes in the HTML 5 specification that will affect your work. Here are some of them:

  • Improved page segmentation. Search engines are getting smarter and there are many reasons to believe that even now they are applying page segmentation. Basically, page segmentation means that a page is divided into several separate parts (i.e. main content, menus, headers, footers, links sections, etc.) and these parts are treated as separate entries. At present, there is no way for a Web master to tell search engines how to segment a page but this is bound to change in HTML 5.

  • A new <article> tag. The new <article> tag is probably the best addition from an SEO point of view. The <article> tag allows you to mark separate entries in an online publication, such as a blog or a magazine. It is expected that when entries are marked with the <article> tag, the HTML code will be cleaner, because the need for <div> tags will be reduced. Also, search engines will probably put more weight on the text inside the <article> tag as compared to the content in the other parts of the page.

  • A new <section> tag. The new <section> tag can be used to identify separate sections of a page, chapter, or book. The advantage is that each section can have its own HTML heading. As with the <article> tag, it can be presumed that search engines will pay more attention to the contents of separate sections. For instance, if the words of a search string are found within one section, this implies higher relevance than when these words are scattered across the page or spread over separate sections.

  • A new <header> tag. The new <header> tag (which is different from the head element) is a blessing for SEO experts because it gives a lot of flexibility. The <header> tag is very similar to the <H1> tag, but the difference is that it can contain a lot of other content, such as H1, H2 and H3 elements, whole paragraphs of text, hard-coded links (and this is really precious for SEO), and any other kind of information you feel is relevant to include.

  • A new <footer> tag. The <footer> tag might not be as useful as the <header> one, but it still allows you to include important information, and it can be used for SEO purposes as well. The <header> and <footer> tags can be used many times on one page – i.e. you can have a separate header/footer for each section, and this gives really a lot of flexibility.

  • A new <nav> tag. Navigation is one of the important factors for SEO and everything that eases navigation is welcome. The new <nav> tag can be used to identify a collection of links to other pages.

As you see, the new tags follow the common structure of a standard page, and each of the parts (i.e. header, footer, main section) has a separate tag. The tags described here are just some (but certainly not all) of the new tags in HTML 5 that will affect SEO in some way. For instance, the <audio>, <video> and <dialog> tags are also part of the HTML 5 standard, and they will allow content to be further separated into the appropriate categories. There are many other tags, but they are of relatively lower importance and that is why they are not discussed here.
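Putting the new tags together, a typical HTML 5 page might be structured as sketched below. How much extra weight search engines will give each part is still speculation, so treat this as an illustration of the markup, not a ranking recipe; the site name and links are invented:

```html
<body>
  <header>
    <h1>My Online Magazine</h1>
    <!-- Unlike <h1>, the header may also hold paragraphs and hard-coded links -->
    <p>Fresh articles every day - <a href="/subscribe">subscribe here</a></p>
  </header>
  <nav>
    <a href="/">Home</a>
    <a href="/archive">Archive</a>
    <a href="/about">About</a>
  </nav>
  <article>
    <header><h2>Entry Title</h2></header>
    <section>
      <h3>First Section</h3>
      <p>Each section gets its own heading and may be treated as a separate entry.</p>
    </section>
    <footer>Posted in: Web Standards</footer>
  </article>
  <!-- A second footer on the same page is perfectly legal -->
  <footer>Copyright notice, sitemap link, etc.</footer>
</body>
```

Notice how the markup itself tells a search engine which part is navigation, which is the main entry, and which is boilerplate – exactly the page segmentation described above.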

For now, HTML 5 is still far in the future. When more pages become HTML 5-compliant, search engines will pay more attention to HTML 5, and only then will it be possible to know exactly how search engines treat HTML 5 pages. The mass adoption of HTML 5 won't happen soon, and it is a safe bet to say that for now you can keep to HTML 4 without concern. Additionally, it will take some time for browsers to adjust to HTML 5, which further delays the moment when HTML 5 will be everywhere.

However, once HTML 5 is accepted and put to use, it will be the dominant standard for years to come, and that is why you might want to keep an eye on what other web masters are doing, just to make sure you don't miss the moment when HTML 5 becomes the de facto standard.

Comments (0)
How to get traffic from Twitter - Tuesday, September 06, 2011

Twitter is one of the latest and greatest Web 2.0 apps and it gets tons of traffic. However, from the point of view of an SEO expert, it is more important that Twitter can get you tons of traffic as well. So, if you still don't have a Twitter account, you'd better open one.

Twitter is simple to use and this is what made it so popular. Twitter is fashionable right now, so enjoy the moment. Even the creators of Twitter admit that as with MySpace and other Web 2.0 sensations, Twitter will inevitably go out of fashion some day, so hurry up and get some traffic for free now, when it is still all the rage of the season.

Twitter is simple to use, yet really powerful. You might need a couple of hours to get familiar with its basic functionality and some of its extras, but you can harness its power even if you don't know it very well.

Unlike most of the other places you can get traffic for free, Twitter is a microblogging platform, which means that there are restrictions on the number of characters in a message. Therefore, you need to be concise in your Tweets and use your space wisely. In addition to being concise, here are some more tips to help you get traffic from Twitter:

1 Make your Twitter profile interesting

Your profile and your username are the first two things visitors see when they go to your Twitter page. If your profile looks boring, people won't bother to read your tweets, let alone visit the links you post in them. You can't write a very long bio, but you can enter a few words about yourself – i.e. your occupation, your interests, etc. You can also include a couple of keywords in your bio.

2 Pick a niche-targeted username

Your username is also very important. You need to pick a username that is targeted at your niche. For instance, if you are promoting your SEO services and want to drive traffic to your SEO site, you can choose something like SEOmaster, SEOguru, SEOservices, etc. Your username will show up in searches other users make, which is why you must pay attention to what you choose.

3 Put your site/blog URL in your profile

According to some statistics, 80% of tweeters don't provide a URL in their bio! Maybe these people are not SEO experts or Internet marketers and don't need the traffic, but you, as an SEO expert, can't afford to miss it. So don't forget to include your URL in your profile!

4 Send the link to your profile to your friends, coworkers, and acquaintances

Your friends, coworkers, and acquaintances will be your most loyal audience, so if they don't know about your Twitter page, make them aware of what they are missing. If you have their emails, or know their accounts on other networks, you can send a mass invite.

5 Search for Twitter users with similar interests

You might have millions of friends, but more followers are always welcome. That is why you should use Twitter's search functions to find people with similar interests. Find as many as you can and invite them all. These people might not be as loyal as your friends, coworkers, and acquaintances, but you will still get hits from them. Some Twitter users report that about 1-2% of their followers visit their site each day, which means that with 1,000 followers you might expect 10 or 20 visits a day to your site. This response rate might seem low, but there are ways to increase it.
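The arithmetic behind that estimate can be sketched in a couple of lines. The 1-2% daily click-through range is the figure quoted above; the function name is made up for illustration.

```python
# Back-of-the-envelope estimate of daily Twitter referral traffic,
# based on a daily click-through rate of 1-2% of your followers.
def expected_daily_visits(followers, rate_low=0.01, rate_high=0.02):
    """Return the (low, high) range of expected daily visits."""
    return (followers * rate_low, followers * rate_high)

low, high = expected_daily_visits(1000)
print(low, high)  # 10.0 20.0
```

So 1,000 followers translate into roughly 10 to 20 visits a day, matching the numbers reported above.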

6 Socialize on Twitter as much as you can

When you are active on Twitter – responding to your followers' posts and visiting their links – you seriously increase the chances that you will get the same in return. In a word, actively follow those who follow you.

7 Tweet regularly

As with all other kinds of media, if you want to keep your audience, you need to feed it regularly. Writing a short tweet takes just seconds, but it is enough to keep your followers happy. It goes without saying that you should tweet about useful things, so if you don't have something meaningful to post about yourself or your sites, it is quite OK to post a link to an article, a video, a blog, etc. that you found on the Net and liked.

8 Don't spam

You might feel that every single user on Twitter is interested in you and your blog/site, but this is not exactly so. You might be tempted to make as many users as possible aware of your Twitter page and your latest tweets, but you'd better refrain from doing so unless you want to risk a ban.

9 Take advantage of Twitterfeed

Twitterfeed is one more useful service you can take advantage of in order to increase your reach. Go to twitterfeed.com and configure your feeds.

10 Make Twitter Search love you

Twitter has a great search function, and its main advantage is that it offers real-time results. Google might be fast at indexing pages, but its indexing is not real-time. Users are hungry for hot news, and nothing beats real-time search. Many bloggers report that they get more traffic from Twitter than from Google, partially because their tweets are popular and users find them with ease.

11 Add Twitter gadgets to your site

There are tons of Twitter gadgets, and new ones are released every day. The cool thing about Twitter gadgets is that your blog visitors can become your Twitter followers. And if your Twitter followers themselves have many followers, chances are that some of those followers will notice you and join your network. As we already mentioned, building a large, targeted network is key to getting more Twitter traffic to your site.

These are some of the main ways in which you can get traffic from Twitter. If you are creative and if you monitor what's going on on Twitter and what new Twitter gadgets are released, you will certainly find more ways to drive traffic from Twitter to your site.

Comments (0)
Make money from your website - Tuesday, September 06, 2011

For millions of people, web sites are the major source of income. Thanks to the various ways to monetize a site, it is possible to make a living as a webmaster. However, this doesn't mean that every webmaster is a millionaire. If you know how to monetize your web site, you might not get rich, but chances are that even under the worst possible scenario you will be able to cover your hosting fees and the other expenses you incur for your site.

Currently the most profitable ways to monetize your site are CPM ads, CPC ads, CPA ads, direct sales of ad space, and selling merchandise/goods/services. These ways are described in this article.

CPM Ads Can Bring In Cents

CPM ads (CPM stands for Cost Per Mille, i.e. cost per thousand impressions) are the oldest type of site monetization. You put banners on your site and advertisers pay you based on the number of unique impressions/page views your site gets.

CPM ads are the easiest form of site monetization because they require almost no effort on your side, but they are also the least lucrative because, as a rule (unless you have really high traffic and a well-targeted audience), the rates are low. Some CPM ad networks pay as little as $0.05 per 1,000 uniques, which means that you need millions of visitors in order to make $100-200 a month.
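It is worth checking that arithmetic. A quick sketch (the function name is made up for illustration) shows why a $0.05 CPM really does require millions of impressions:

```python
# How many impressions a given CPM rate requires to hit an earnings target.
def impressions_needed(target_dollars, cpm_rate):
    """CPM pays `cpm_rate` dollars per 1,000 impressions."""
    return int(round(target_dollars / cpm_rate * 1000))

print(impressions_needed(100, 0.05))  # 2,000,000 impressions for $100
print(impressions_needed(200, 0.05))  # 4,000,000 impressions for $200
```

So the quoted $100-200 a month corresponds to 2-4 million impressions at the lowest rates.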

Popups and layer ads pay a bit more ($0.50-2 per 1,000 uniques/pageviews). They could bring you a few dollars a month, but many visitors find them especially irritating, which is why many webmasters are not willing to put popups and/or layer ads on their sites.

There are high-paying CPM ads – for instance, ads paying $2-5 per 1,000 uniques – but usually the networks that run them have very high traffic and quality requirements for eligible sites. As a result, it is very hard to get into these networks and take advantage of these well-paid CPM ads.

There are many CPM networks we can recommend. For sites with lots of traffic the choice is better, and some of the good options are Advertising.com, TribalFusion, CasaleMedia, and ValueClick, because their CPM rates are good. AdBrite, AdToll, Right Media, and BurstMedia are also good. Some of these networks are CPM-only, while others offer other types of ads as well. Google also has CPM ads, so you may want to try them.

CPC Ads Are a Profitable Way to Monetize a Site

CPC ads (CPC stands for Cost Per Click) differ from CPM ads in that you get paid not when visitors view the ads but when they click on them. The good news is that CPC rates are much, much better, and as a result it is possible to make a decent income even with a small site without much traffic. The key to CPC success is a well-targeted site in a niche with a lot of advertisers.

The most popular CPC network is Google AdSense, and even though there are other CPC networks, the income you can make from AdSense is usually much higher. The reason is that Google AdSense has many advertisers, and if your site is in a profitable niche, CPC can be the best way to monetize it. Additionally, unlike some of the other CPC ad networks, Google AdSense is open to publishers from all over the world.

Usually CPC ads are text ads published in blocks. However, there are also in-text ads, where a keyword is underlined and a tooltip with CPC ads appears when the visitor hovers the mouse over it. In-text ads are less obtrusive, but it takes much more effort for the user to notice them (and above all, to click them), which means your chances of making money are further decreased. Kontera is one of the most popular networks for in-text CPC ads.

The list of good CPC ad networks is not as long as the list of CPM or CPA ad networks, but there is still a choice. For instance, Google AdSense, Yahoo! Publisher Network (YPN), BidVertiser, Chitika, and Clicksor are generally considered top choices, but since many CPM ad networks also pay decent rates for clicks on their CPM ads, you might want to try them as well.

CPA Ads Could Make You Rich

Currently CPA (which stands for Cost Per Action) is the most profitable way to monetize a site in a profitable niche. CPA ads, also known as “affiliate programs”, pay you a commission when your visitors perform an action. Most often this action is a purchase, but it could also be something else – e.g. downloading a free trial or signing up for the advertiser's service.

Affiliate programs can make you rich because there are many products with really fat commissions. For niches such as health, finance, travel, etc., affiliate programs are a real gold mine.

However, affiliate programs require a lot of effort on your side, and there is still no guarantee that the offers you pick to promote will convert well and make you money. Sometimes even the highest-converting offers won't convert on your site, and the only thing you can do is replace them with other offers, hoping they will convert better.

There might be hundreds of CPM ad networks, but CPA networks number in the thousands. It is practically impossible to try them all personally, which is why we would like to recommend some of the best to start with: Amazon, eBay, ShareASale, Commission Junction, Clickbank, Max Bounty, Azoogle, Never Blue Ads, LinkShare, and PepperJam are just a few of the great CPA networks we can recommend.

Direct Sales of Ads

If you are not happy with the CPM rates of ad networks, or CPC and CPA don't convert well on your site, you could try to find direct advertisers. However, bear in mind that such an endeavor is not guaranteed to succeed, and if your site doesn't have an audience advertisers can't reach elsewhere, you will hardly be able to negotiate good prices. Still, for some sites direct sales of ad space are a viable alternative, which is why you could also try it.

Openads Ad Server and OIO Publisher Ad Platform are two of the sites where you could try your luck. AdBrite also allows you to price your ad space. In fact, almost any major network gives you the chance to put a widget on your site that invites advertisers to advertise directly on your site.

Sell Merchandise/Goods/Services

In addition to CPA ads, where you are selling other merchants' products, you can try selling your own merchandise or products/services under your brand. This technique works well mainly for popular sites with a loyal audience, and it is hardly the easiest way to monetize a site. You could offer merchandise/goods/services as a supplementary source of income, and if you see that this monetization technique works, you can expand the business. CafePress is one of the best places to sell merchandise with your logo, and the greatest thing is that they print on demand, which means you don't have to keep your merchandise in stock.

The monetization techniques described here can be combined. You can run CPM ads together with CPA or CPC ads. You can also combine multiple ads of the same type (i.e. CPM, CPC, or CPA) from different ad networks, provided this doesn't violate those networks' terms. There is no universal prescription for the best way to monetize a site. The basic rule is to experiment and see what works for you; the fact that a given monetization technique works for somebody else doesn't mean it will work for you.

Comments (0)
How to Pick an SEO Friendly Designer - Monday, September 05, 2011
It is very important to hire an SEO-friendly designer, because if you don't and your site is designed in an SEO-unfriendly fashion, you can't compensate for this later. This article will tell you how to pick an SEO-friendly designer and save yourself the disappointment of low search engine rankings.

A Web designer is one of the people without whom it is not possible to create a site. However, where SEO is concerned, Web designers can be really painful to deal with. While there are many Web designers who are SEO-proficient, it is not uncommon to stumble upon design geniuses who focus only on the graphic aspect of the site. For them SEO is none of their business, and they couldn't care less about something as "unimportant" as good search engine rankings. Needless to say, if you hire such a designer, don't expect your site to rank well with search engines.

If you plan to do SEO on your own, you might not care much about the SEO skills of your Web designer, but as we'll see next, there are design issues that can affect your rankings very badly. When a site is designed against SEO rules, it is not possible to fix this later with SEO tricks.

When we say that you need to hire an SEO-friendly designer, we presume that you are an SEO pro yourself; if you aren't, have a look at the SEO Tutorial and the SEO Checklist. If you have no idea about SEO, you will hardly be able to select an SEO-friendly designer because you won't know what to look for.

One of the ultimate tests of whether a designer is SEO-friendly is to look at his or her past sites: are they done professionally, especially in the SEO department? If those past sites don't exhibit blatant SEO mistakes, such as the ones we'll list in a second, and they rank well, this is a recommendation that this person is worth hiring. Still, after you look at past sites, ask the designer whether he or she did the SEO for them, because in some cases the client may have done a lot to optimize the site, and that is why it ranks well.

Here is a checklist of common web design sins that will make your site an SEO disaster. If you notice any of the following in the past sites your would-be designer has created, just move on to the next designer. These SEO-unfriendly design elements are absolute sins, and unless the client made them do it, no designer who uses the techniques below deserves your attention:

1 Rely heavily on Flash

Many designers still believe that Flash is the best thing since sliced bread. While Flash can be very artistic and make a site look cool (and load forever in the browser), heavily Flash-ed sites are a disaster in terms of SEO. Simple HTML sites rank better with search engines, and as we point out in Optimizing Flash Sites, if the use of Flash is a must, an HTML version of the same page is more than mandatory.

2 No internal links, or very few links

Internal links are backlinks and they are very important. Of course, this doesn't mean that all the text on a page must be hyperlinked to all the other pages on the site, but if there are only a couple of internal links per page, this is a missed chance to get backlinks.

3 Images, not text for anchors

This is another frequent mistake many designers make. Anchor text is vital in SEO, and when your links lack anchor text, this is bad. It is true that for menu items and other page elements it is much easier to use an image than text, because with text you can never be sure it will display exactly as intended on users' screens, but since image-only links impact your site's rankings negatively, you should sacrifice beauty for functionality.
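You can check a page for this mistake automatically. The sketch below, using only Python's standard library, flags links whose only content is an image and therefore carry no anchor text for search engines to read; the class name is made up for illustration.

```python
# Audit sketch: find links with no anchor text (e.g. image-only links).
from html.parser import HTMLParser

class AnchorAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.has_text = False
        self.href = None
        self.image_only_links = []   # hrefs that lack anchor text

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.has_text = False
            self.href = dict(attrs).get("href")

    def handle_data(self, data):
        if self.in_link and data.strip():
            self.has_text = True     # the link contains real text

    def handle_endtag(self, tag):
        if tag == "a":
            if not self.has_text:
                self.image_only_links.append(self.href)
            self.in_link = False

audit = AnchorAudit()
audit.feed('<a href="/about"><img src="about.gif"></a>'
           '<a href="/services">Our SEO services</a>')
print(audit.image_only_links)  # ['/about']
```

The first link is reported because it wraps only an image; the second passes because "Our SEO services" gives search engines real anchor text to index.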

4 Messy code and tons of code

If you have no idea about HTML, it might be impossible for you to judge whether a site's code is messy or excessive, but clean code is an important criterion for SEO. When the code is messy, it might not be spiderable at all, which can literally exclude your site from search engines because they won't be able to index it.

5 Excessive use of (SEO non-friendly) JavaScript

As with Flash, search engines don't love JavaScript, especially tons of it. Actually, the worst thing about JavaScript is that, if it is not coded properly, your pages (or parts of them) may not be spiderable, which automatically means they won't be indexed.

6 Overoptimized sites

Overoptimized sites are no better than under-optimized ones. In fact, they could be much worse, because when you stuff keywords and use other techniques (even ones that are not Black Hat SEO) to artificially inflate the site's rankings, you could get banned from search engines, which is the worst thing that can happen to a site.

7 Dynamic and other SEO non-friendly URLs

Well, maybe dynamic URLs are not exactly a design issue, but if you are getting a turn-key site – i.e. it is not up to you to upload and configure it and create the internal links – then dynamic URLs are bad, and you have to ask the designer/developer not to use them. You can rewrite dynamic and other SEO-unfriendly URLs on your own, but that means making dramatic changes to the site, which is hardly the point of hiring a designer.
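To illustrate what such a rewrite does, here is a minimal sketch that turns a dynamic URL's query parameters into a static-looking, keyword-rich path. The parameter names (`id`, `title`) and the output format are made-up assumptions; a real site would map its own parameters.

```python
# Illustrative sketch: derive an SEO-friendly path from a dynamic URL.
import re
from urllib.parse import urlparse, parse_qs

def seo_friendly(url):
    """Turn e.g. product.php?id=42&title=Blue+Widget into /blue-widget-42.html."""
    parsed = urlparse(url)
    params = parse_qs(parsed.query)
    title = params.get("title", ["item"])[0]
    # lowercase the title and replace non-alphanumeric runs with hyphens
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    item_id = params.get("id", ["0"])[0]
    return f"/{slug}-{item_id}.html"

print(seo_friendly("http://example.com/product.php?id=42&title=Blue+Widget"))
# /blue-widget-42.html
```

In practice the mapping is usually done at the server level (e.g. with URL rewriting rules), but the idea is the same: the keywords end up in the path, where search engines can see them, instead of hiding in query parameters.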

These points are very important, which is why you need to follow them when choosing an SEO-friendly designer. Some of the items on the list are so bad for SEO (i.e. Flash, JavaScript) that even if the site is a design masterpiece and you promote it heavily, you will still be unable to get decent rankings. SEO-friendliness of design is a necessity, not a whim, and you shouldn't settle for an SEO-unfriendly design – it can be really expensive!

Comments (0)